On the Relation of Slow Feature Analysis and Laplacian Eigenmaps

Author

  • Henning Sprekeler

Abstract

The past decade has seen a rise of interest in Laplacian eigenmaps (LEMs) for nonlinear dimensionality reduction. LEMs have been used in spectral clustering, in semisupervised learning, and for providing efficient state representations for reinforcement learning. Here, we show that LEMs are closely related to slow feature analysis (SFA), a biologically inspired, unsupervised learning algorithm originally designed for learning invariant visual representations. We show that SFA can be interpreted as a function approximation of LEMs, where the topological neighborhoods required for LEMs are implicitly defined by the temporal structure of the data. Based on this relation, we propose a generalization of SFA to arbitrary neighborhood relations and demonstrate its applicability for spectral clustering. Finally, we review previous work with the goal of providing a unifying view on SFA and LEMs.
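
The claimed relation can be made concrete in a few lines: SFA's slowness objective is exactly the quadratic form of the graph Laplacian of a chain graph whose edges connect consecutive time steps. Below is a minimal sketch on assumed toy data (NumPy/SciPy; not the paper's code), comparing linear SFA on a fixed nonlinear expansion with the eigenvectors of that chain-graph Laplacian:

```python
# Toy illustration (assumed data, not the paper's code) of the SFA <-> LEM
# link: linear SFA on an expanded signal vs. Laplacian eigenmaps on the
# chain graph defined by temporal adjacency.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
T = 500
x = np.cumsum(rng.standard_normal(T))            # slowly drifting latent signal
X = np.column_stack([x, x**2, np.sin(0.1 * x)])  # fixed nonlinear expansion
X -= X.mean(axis=0)

# Linear SFA: minimize the variance of the temporal derivative subject to
# unit variance, i.e. the generalized eigenproblem A w = lambda C w.
dX = np.diff(X, axis=0)
A = dX.T @ dX / (T - 1)          # covariance of temporal differences
C = X.T @ X / T                  # covariance of the expanded signal
_, W = eigh(A, C)                # ascending eigenvalues; column 0 is slowest
slow = X @ W[:, 0]

# LEM view: consecutive samples are neighbors, so the graph is a chain.
adj = np.zeros((T, T))
i = np.arange(T - 1)
adj[i, i + 1] = adj[i + 1, i] = 1.0
L = np.diag(adj.sum(axis=1)) - adj               # unnormalized graph Laplacian
_, U = np.linalg.eigh(L)                         # U[:, 1] is the Fiedler vector

# The slowest SFA feature approximates the second Laplacian eigenvector
# within its function space; agreement improves with a richer expansion.
print(abs(np.corrcoef(slow, U[:, 1])[0, 1]))
```

This is the sense in which SFA acts as a function approximation of LEMs: it searches for the Laplacian eigenvectors within the span of the chosen expansion, with neighborhoods given implicitly by temporal adjacency.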


Similar Articles

Adaptive KNN Classification Based on Laplacian Eigenmaps and Kernel Mixtures

K Nearest Neighbor (kNN) is one of the most popular machine learning techniques, but it often fails to work well with an inappropriate choice of distance metric or in the presence of many irrelevant features. Linear and non-linear feature transformation methods have been applied to extract class-relevant information to improve kNN classification. In this paper, I describe kNN classification ...
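
The pattern the abstract describes, learning a class-relevant transformation and then running kNN in the transformed space, can be sketched generically. In the sketch below, LDA stands in for the paper's LEM/kernel-mixture transform, which is not reproduced here:

```python
# Generic sketch (not the paper's algorithm): supervised feature
# transformation followed by kNN classification, vs. kNN on raw features.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

raw = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
lda_knn = make_pipeline(LinearDiscriminantAnalysis(n_components=2),
                        KNeighborsClassifier(n_neighbors=5)).fit(X_tr, y_tr)
print(raw.score(X_te, y_te), lda_knn.score(X_te, y_te))
```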


Automated spike sorting algorithm based on Laplacian eigenmaps and k-means clustering.

This study presents a new automatic spike sorting method based on feature extraction by Laplacian eigenmaps combined with k-means clustering. The performance of the proposed method was compared against previously reported algorithms such as principal component analysis (PCA) and amplitude-based feature extraction. Two types of classifier (namely k-means and classification expectation-maximizati...
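
A rough sketch of that pipeline on synthetic waveforms, using scikit-learn's SpectralEmbedding as the Laplacian-eigenmap step (not the authors' implementation):

```python
# Synthetic spike sorting sketch: Laplacian-eigenmap features + k-means.
import numpy as np
from sklearn.manifold import SpectralEmbedding   # Laplacian eigenmaps
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)
templates = [np.exp(-((t - c) / 0.05) ** 2) for c in (0.3, 0.5, 0.7)]
labels = rng.integers(0, 3, 300)
spikes = np.array([templates[k] + 0.05 * rng.standard_normal(64)
                   for k in labels])

# Embed the waveforms with Laplacian eigenmaps, then cluster the embedding.
emb = SpectralEmbedding(n_components=2, n_neighbors=10).fit_transform(spikes)
pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(emb)
```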


Dimension Reduction for Hyperspectral Imaging Using Laplacian Eigenmaps and Randomized Principal Component Analysis: Midyear Report

Hyperspectral imaging has attracted researchers' interest in recent years. Because of its high dimensionality and complexity, dimension reduction of hyperspectral data sets has become crucial for processing and categorizing the data. In this project, I will apply two methods of dimension reduction: Laplacian eigenmaps and randomized principal component analysis, in order to reduce the dim...
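
As a sketch of the second method, here is randomized PCA applied to an assumed hyperspectral cube (pixels as rows, spectral bands as columns; not the report's code):

```python
# Randomized PCA on a fake hyperspectral cube (assumed shapes).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
cube = rng.random((100, 100, 200))   # height x width x spectral bands
X = cube.reshape(-1, 200)            # one spectrum per pixel

# Randomized SVD keeps the cost manageable for many pixels and bands.
pca = PCA(n_components=10, svd_solver="randomized", random_state=0)
X_low = pca.fit_transform(X)
print(X_low.shape, pca.explained_variance_ratio_.sum())
```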


Convergence of Laplacian Eigenmaps

Geometrically based methods for various tasks of data analysis have attracted considerable attention over the last few years. In many of these algorithms, a central role is played by the eigenvectors of the graph Laplacian of a data-derived graph. In this paper, we show that if points are sampled uniformly at random from an unknown submanifold M of R^N, then the eigenvectors of a suitably const...
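
The object under study, the graph Laplacian of a data-derived graph, can be constructed in a few lines. A sketch with points sampled from the circle, where the limiting eigenfunctions are known (this is an illustration, not the paper's construction):

```python
# Graph Laplacian of a kNN graph built from samples on a manifold (a circle).
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(theta), np.sin(theta)]        # uniform samples from S^1

# Symmetrized kNN adjacency and its unnormalized graph Laplacian.
W = kneighbors_graph(X, n_neighbors=8, mode="connectivity")
W = 0.5 * (W + W.T)
L = laplacian(W).toarray()
evals, evecs = np.linalg.eigh(L)
# As the sample size grows, the leading eigenvectors approach the
# Laplace-Beltrami eigenfunctions on the circle (sines/cosines of theta).
```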


Discussion of "Spectral Dimensionality Reduction via Maximum Entropy"

Since the introduction of LLE (Roweis and Saul, 2000) and Isomap (Tenenbaum et al., 2000), a large number of non-linear dimensionality reduction techniques (manifold learners) have been proposed. Many of these non-linear techniques can be viewed as instantiations of Kernel PCA; they employ a cleverly designed kernel matrix that preserves local data structure in the “feature space” (Bengio et al...
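
The kernel PCA view can be illustrated directly. A sketch with a generic RBF kernel on a toy dataset (the kernel here is not one of the cited, data-dependent constructions):

```python
# Kernel PCA with an RBF kernel on concentric circles.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10.0)
X_k = kpca.fit_transform(X)   # the two rings typically separate
                              # along the leading components
```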



Journal:
  • Neural Computation

Volume 23, Issue 12

Pages: -

Published: 2011